

Exponential Separations in Symmetric Neural Networks

Zweig, Aaron, Bruna, Joan

arXiv.org Artificial Intelligence

The modern success of deep learning can in part be attributed to architectures that enforce appropriate invariance. Invariance to permutation of the input, i.e., treating the input as an unordered set, is a desirable property when learning symmetric functions in such fields as particle physics and population statistics. The simplest architectures that enforce permutation invariance treat each set element individually without allowing for interaction, as captured by the popular DeepSet model [18, 32]. Several architectures explicitly enable interaction between set elements, the simplest being the Relational Networks [21] that encode pairwise interaction. This may be understood as an instance of self-attention, the mechanism underlying Transformers [27], which have emerged as powerful generic neural network architectures to process a wide variety of data, from image patches to text to physical data. Specifically, Set Transformers [12] are special instantiations of Transformers, made permutation equivariant by omitting positional encoding of inputs, and using self-attention for pooling.
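As a concrete illustration of the simplest permutation-invariant architecture mentioned above, the sketch below implements a DeepSet-style model in plain NumPy. The weights and layer sizes are arbitrary placeholders, not from the paper: an elementwise embedding, sum pooling, and an output map. Because pooling is a sum, permuting the set elements leaves the output unchanged.

```python
import numpy as np

def deepset(x, W_phi, W_rho):
    # f(X) = rho( sum_i phi(x_i) ): invariant to permuting the rows of x.
    h = np.tanh(x @ W_phi)        # per-element embedding phi
    pooled = h.sum(axis=0)        # permutation-invariant sum pooling
    return np.tanh(pooled @ W_rho)  # output map rho

rng = np.random.default_rng(0)
W_phi = rng.normal(size=(3, 16))  # hypothetical weight shapes
W_rho = rng.normal(size=(16, 1))

x = rng.normal(size=(5, 3))       # a set of 5 elements in R^3
perm = rng.permutation(5)
out1 = deepset(x, W_phi, W_rho)
out2 = deepset(x[perm], W_phi, W_rho)  # same output, different element order
```

Replacing the inner sum with a sum over pairs g(x_i, x_j) would give a Relational-Network-style model; self-attention pooling, as in Set Transformers, generalizes the interaction structure further.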


Hierarchical Clustering of Asymmetric Networks

Carlsson, Gunnar, Mémoli, Facundo, Ribeiro, Alejandro, Segarra, Santiago

arXiv.org Machine Learning

This paper considers networks where relationships between nodes are represented by directed dissimilarities. The goal is to study methods that, based on the dissimilarity structure, output hierarchical clusters, i.e., a family of nested partitions indexed by a connectivity parameter. Our construction of hierarchical clustering methods is built around the concept of admissible methods, which are those that abide by the axioms of value (the two nodes of a two-node network are clustered together at the maximum of the two dissimilarities between them) and transformation (when dissimilarities are reduced, the network may become more clustered but not less). Two particular methods, termed reciprocal and nonreciprocal clustering, are shown to provide upper and lower bounds in the space of admissible methods. Furthermore, alternative clustering methodologies and axioms are considered. In particular, modifying the axiom of value so that clustering in two-node networks occurs at the minimum of the two dissimilarities entails the existence of a unique admissible clustering method.
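The reciprocal method admits a compact computational sketch (an illustrative reading of the definition, not code from the paper): symmetrize each pair of directed dissimilarities by their maximum, consistent with the axiom of value, then merge clusters by single linkage, i.e., minimax path cost over the symmetrized network.

```python
import numpy as np

def reciprocal_ultrametric(D):
    # Symmetrize a directed dissimilarity by the max in both directions,
    # then compute single-linkage merge levels as minimax path costs
    # via a Floyd-Warshall-style relaxation.
    n = D.shape[0]
    S = np.maximum(D, D.T)
    U = S.copy()
    for k in range(n):
        for i in range(n):
            for j in range(n):
                U[i, j] = min(U[i, j], max(U[i, k], U[k, j]))
    np.fill_diagonal(U, 0.0)
    return U

# Toy directed dissimilarity matrix (hypothetical values).
D = np.array([[0., 1., 5.],
              [2., 0., 1.],
              [5., 4., 0.]])
U = reciprocal_ultrametric(D)
# Nodes 0 and 1 merge at max(1, 2) = 2; node 2 joins at level 4,
# since the path 0-1-2 has minimax cost max(2, 4) = 4 < 5.
```

The result U is a valid ultrametric: it is symmetric and satisfies the strong triangle inequality, which is exactly the structure of a family of nested partitions indexed by the connectivity parameter.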



Constructing Proofs in Symmetric Networks

Pinkus, Gadi

Neural Information Processing Systems

This paper considers the problem of expressing predicate calculus in connectionist networks that are based on energy minimization. Given a first-order-logic knowledge base and a bound k, a symmetric network is constructed (like a Boltzmann machine or a Hopfield network) that searches for a proof of a given query. If a resolution-based proof of length no longer than k exists, then the global minima of the energy function associated with the network represent such proofs. The generated network is cubic in size in the bound k and linear in the size of the knowledge base. There are no restrictions on the type of logic formulas that can be represented.
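The proof-search construction itself is intricate, but the underlying mechanism, descending an energy function over a symmetric network until a minimum is reached, can be sketched in a few lines. This is a generic Hopfield-style toy with hypothetical Hebbian weights, not the paper's proof-encoding network:

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, theta, s):
    # Hopfield-style energy: E(s) = -1/2 s^T W s + theta . s
    return -0.5 * s @ W @ s + theta @ s

def anneal(W, theta, s, steps=2000):
    # Asynchronous single-bit updates that never increase the energy,
    # so the state settles into a local (ideally global) minimum.
    n = len(s)
    for _ in range(steps):
        i = rng.integers(n)
        flipped = s.copy()
        flipped[i] = 1 - flipped[i]
        if energy(W, theta, flipped) <= energy(W, theta, s):
            s = flipped
    return s

# Tiny symmetric network storing two patterns via a Hebbian rule.
n = 8
patterns = np.array([[1, 1, 1, 1, 0, 0, 0, 0],
                     [0, 0, 0, 0, 1, 1, 1, 1]])
P = 2 * patterns - 1                 # +-1 versions of the patterns
W = (P.T @ P).astype(float)
np.fill_diagonal(W, 0.0)             # symmetric weights, zero diagonal
theta = np.zeros(n)

s0 = rng.integers(0, 2, n).astype(float)  # random initial state
s_final = anneal(W, theta, s0)
```

In the paper's construction the analogous weights encode resolution steps, so that states of minimal energy correspond to proofs of length at most k rather than stored patterns.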


Dynamics of Analog Neural Networks with Time Delay

Marcus, Charles M., Westervelt, R. M.

Neural Information Processing Systems

A time delay in the response of the neurons in a network can induce sustained oscillation and chaos. We present a stability criterion based on local stability analysis to prevent sustained oscillation in symmetric delay networks, and show an example of chaotic dynamics in a non-symmetric delay network.
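A minimal simulation makes the role of the delay concrete. This is an illustrative Euler integration of a standard delayed Hopfield-type equation; the coupling matrix, gain, and delay are arbitrary choices, not values taken from the paper:

```python
import numpy as np

def simulate_delay_network(W, tau, dt=0.01, T=50.0, gain=1.0):
    # Euler integration of du_i/dt = -u_i(t) + sum_j W_ij tanh(gain * u_j(t - tau)).
    # The state history over the delay window is kept in a rolling buffer.
    n = W.shape[0]
    lag = int(round(tau / dt))
    steps = int(round(T / dt))
    u = np.zeros((steps + lag, n))
    u[:lag] = 0.1                      # constant initial history on [-tau, 0]
    for t in range(lag, steps + lag - 1):
        u[t + 1] = u[t] + dt * (-u[t] + np.tanh(gain * u[t - lag]) @ W.T)
    return u[lag:]

# Symmetric two-neuron network with mutual inhibition (hypothetical coupling).
W = np.array([[0., -2.],
              [-2., 0.]])
traj = simulate_delay_network(W, tau=1.0)
```

Sweeping `tau` or `gain` and inspecting `traj` shows how increasing the delay can turn a decaying transient into sustained oscillation, which is the regime the stability criterion is designed to exclude.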

